Listing 1 - 3 of 3
The once esoteric idea of embedding scientific computing into a probabilistic framework, mostly along the lines of the Bayesian paradigm, has recently enjoyed wide popularity and found its way into numerous applications. This book provides an insider's view of how to combine two mature fields, scientific computing and Bayesian inference, into a powerful language leveraging the capabilities of both components for computational efficiency, high resolving power, and uncertainty quantification. The impact of Bayesian scientific computing has been particularly significant in the area of computational inverse problems, where the data are often scarce or of low quality, but some characteristics of the unknown solution may be available a priori. The ability to combine the flexibility of the Bayesian probabilistic framework with efficient numerical methods has contributed to the popularity of Bayesian inversion, with the prior distribution being the counterpart of classical regularization. However, the interplay between Bayesian inference and numerical analysis is much richer than providing an alternative way to regularize inverse problems, as demonstrated by the discussion of time-dependent problems, iterative methods, and sparsity-promoting priors in this book. The quantification of uncertainty in computed solutions and model predictions is another area where Bayesian scientific computing plays a critical role. This book demonstrates that Bayesian inference and scientific computing have much more in common than one may expect, and gradually builds a natural interface between these two areas.
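The abstract's claim that the prior distribution is the counterpart of classical regularization can be made concrete in a minimal sketch (not taken from the book): for a linear model y = Ax + Gaussian noise, a zero-mean Gaussian prior on x makes the maximum a posteriori (MAP) estimate coincide with Tikhonov-regularized least squares. The matrix `A`, the data, and the parameter `lam` below are illustrative choices, not the book's examples.

```python
# Illustrative sketch: MAP estimation under a Gaussian prior reduces to
# Tikhonov regularization for a linear Gaussian inverse problem.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))              # assumed forward operator
x_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = A @ x_true + 0.1 * rng.standard_normal(20)  # noisy data

lam = 0.1  # ratio of noise variance to prior variance (assumed)
# MAP / Tikhonov solution of min ||A x - y||^2 + lam ||x||^2:
# solve the normal equations (A^T A + lam I) x = A^T y
x_map = np.linalg.solve(A.T @ A + lam * np.eye(5), A.T @ y)
print(x_map)
```

The same algebra explains why a tighter prior (larger `lam`) pulls the reconstruction toward the prior mean, exactly as a stronger classical regularization term would.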
Bayesian statistical decision theory. --- Inverse problems (Differential equations) --- Numerical solutions. --- Numerical analysis --- Bayes' solution --- Bayesian analysis --- Statistical decision --- Bayesian statistics --- Inverse problems (Differential equations) --- Numerical solutions
Focusing on Bayesian methods and maximum entropy, this book shows how a few fundamental rules can be used to tackle a variety of problems in data analysis. Topics covered include reliability analysis, multivariate optimisation, least-squares and maximum likelihood, and more.
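One of the fundamental rules the abstract alludes to is the maximum-entropy principle: among all distributions satisfying given constraints, choose the one with the largest entropy. A small hedged sketch (not the book's code) illustrates it on the classic die problem: given only a prescribed mean, the maximum-entropy distribution over the faces has exponential form p_i ∝ exp(-λi), and the multiplier λ can be found by bisection on the mean constraint. The target mean of 4.5 is an assumed example value.

```python
# Illustrative sketch: maximum-entropy distribution over die faces 1..6
# subject to a mean constraint, solved by bisection on the Lagrange multiplier.
import numpy as np

faces = np.arange(1, 7)

def mean_for(lam):
    """Mean of the exponential-family distribution p_i ∝ exp(-lam * i)."""
    w = np.exp(-lam * faces)
    p = w / w.sum()
    return p @ faces

target = 4.5          # assumed observed mean (> 3.5, so lam will be negative)
lo, hi = -5.0, 5.0    # bracket; mean_for is strictly decreasing in lam
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) > target:
        lo = mid      # mean too large -> increase lam
    else:
        hi = mid
lam = 0.5 * (lo + hi)
w = np.exp(-lam * faces)
p = w / w.sum()
print(p, p @ faces)
```

The exponential form is not an ad hoc choice: it falls out of maximizing the entropy −Σ p_i log p_i with Lagrange multipliers for normalization and the mean constraint, which is the kind of derivation the book builds its methods on.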
Maximum entropy method. --- Maximum principles (Mathematics) --- Engineering mathematics. --- Science --- Engineering --- Engineering analysis --- Mathematical analysis --- Differential equations, Partial --- Entropy maximization --- Entropy maximum principle --- Maximization, Entropy --- Entropy (Information theory) --- Mathematics. --- Mathematics --- Numerical solutions
An insightful investigation into the mechanisms underlying the predictive functions of neural networks, and their ability to chart a new path for AI. Prediction is a cognitive advantage like few others, inherently linked to our ability to survive and thrive. Our brains are awash in signals that embody prediction. Can we extend this capability more explicitly into synthetic neural networks to improve the function of AI and enhance its place in our world? Gradient Expectations is a bold effort by Keith L. Downing to map the origins and anatomy of natural and artificial neural networks to explore how, when designed as predictive modules, their components might serve as the basis for the simulated evolution of advanced neural network systems. Downing delves into the known neural architecture of the mammalian brain to illuminate the structure of predictive networks and determine more precisely how the ability to predict might have evolved from more primitive neural circuits. He then surveys past and present computational neural models that leverage predictive mechanisms with biological plausibility, identifying elements, such as gradients, that natural and artificial networks share. Behind well-founded predictions lie gradients, Downing finds, but of a different scope than those that belong to today's deep learning. Digging into the connections between predictions and gradients, and their manifestation in the brain and neural networks, is one compelling example of how Downing enriches both our understanding of such relationships and their role in strengthening AI tools. Synthesizing critical research in neuroscience, cognitive science, and connectionism, Gradient Expectations offers unique depth and breadth of perspective on predictive neural-network models, including a grasp of predictive neural circuits that enables the integration of computational models of prediction with evolutionary algorithms.
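The link between prediction and gradients that the abstract highlights can be illustrated with a minimal sketch (not Downing's model): a single linear unit predicts the next value of a signal and adjusts its weights by descending the gradient of its squared prediction error, i.e., the delta rule. The toy sinusoidal signal, window length, and learning rate are all assumptions for illustration.

```python
# Illustrative sketch: a linear predictor trained online by the delta rule,
# a gradient step on the squared prediction error.
import numpy as np

seq = np.sin(0.3 * np.arange(500))   # assumed toy signal
w = np.zeros(3)                      # weights over the last 3 samples
lr = 0.05                            # assumed learning rate
for t in range(3, len(seq)):
    context = seq[t - 3:t]
    pred = w @ context               # prediction of the next sample
    err = seq[t] - pred              # prediction error drives learning
    w += lr * err * context          # gradient step on 0.5 * err**2
final_err = abs(err)
print(w, final_err)
```

Even this toy example shows the distinction the book draws: here the gradient is local and error-driven at a single unit, a far narrower scope than the end-to-end backpropagated gradients of today's deep learning.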
Neural networks (Computer science) --- Artificial neural networks --- Nets, Neural (Computer science) --- Networks, Neural (Computer science) --- Neural nets (Computer science) --- Artificial intelligence --- Natural computation --- Soft computing --- Deep learning (Machine learning) --- Conjugate gradient methods. --- Gradient methods, Conjugate --- Approximation theory --- Equations --- Iterative methods (Mathematics) --- Learning, Deep (Machine learning) --- Machine learning --- Numerical solutions